Sequential Testing for Sparse Recovery

Authors
Abstract


Similar Articles

Interpretable Recurrent Neural Networks Using Sequential Sparse Recovery

Recurrent neural networks (RNNs) are powerful and effective for processing sequential data. However, RNNs are usually considered “black box” models whose internal structure and learned parameters are not interpretable. In this paper, we propose an interpretable RNN based on the sequential iterative soft-thresholding algorithm (SISTA) for solving the sequential sparse recovery problem, which mod...
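
For orientation, the update that SISTA unrolls is the classical iterative soft-thresholding algorithm (ISTA). The sketch below shows that generic building block only, not the paper's recurrent architecture; A, y, lam and the iteration count are placeholder inputs.

    import numpy as np

    def soft_threshold(z, t):
        """Soft-thresholding: the proximal operator of the L1 norm."""
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def ista(A, y, lam, n_iter=100):
        """Minimize 0.5*||y - A x||^2 + lam*||x||_1 by iterative soft-thresholding."""
        L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            grad = A.T @ (A @ x - y)           # gradient of the quadratic data term
            x = soft_threshold(x - grad / L, lam / L)
        return x

Each ISTA iteration maps naturally to one layer of an unrolled network, which is the kind of correspondence such interpretable-RNN constructions exploit.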


Sparse Recovery

List of included articles: [1] H. Rauhut. Random sampling of sparse trigonometric polynomials. Appl. Comput. [2] S. Kunis and H. Rauhut. Random sampling of sparse trigonometric polynomials II: orthogonal matching pursuit versus basis pursuit. [3] H. Rauhut. Stability results for random sampling of sparse trigonometric polynomials. [4] H. Rauhut. On the impossibility of uniform sparse reconstruct...


Adaptive Sampling for Sparse Recovery

Consider n data sequences, each consisting of independent and identically distributed elements drawn from one of the two possible zero-mean Gaussian distributions with variances A0 and A1. The problem of quickly identifying all of the sequences with variance A1 is considered and an adaptive two-stage experimental design and testing procedure is proposed. The agility and reliability gains in comp...
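
A rough sketch of what a generic two-stage screening procedure of this kind can look like is given below; the sample sizes, the midpoint threshold, and the function name two_stage_screen are arbitrary placeholders for illustration, not the optimized design analyzed in the paper.

    import numpy as np

    def two_stage_screen(sequences, var0, var1, n1=10, n2=100):
        """Return indices of sequences declared to have the larger variance var1."""
        thresh = (var0 + var1) / 2.0                 # illustrative midpoint threshold
        # Stage 1: a quick look at the first n1 samples of every sequence.
        candidates = [i for i, s in enumerate(sequences)
                      if np.var(s[:n1]) > thresh]
        # Stage 2: spend additional samples only on the surviving candidates.
        return [i for i in candidates
                if np.var(sequences[i][:n1 + n2]) > thresh]

The point of the two stages is that most sequences are dismissed cheaply in the first pass, so the sampling budget concentrates on the few plausible high-variance candidates.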


Numerical methods for sparse recovery

These lecture notes are an introduction to methods recently developed for performing numerical optimization with linear model constraints and additional sparsity conditions on the solutions, i.e. we expect solutions which can be represented as sparse vectors with respect to a prescribed basis. This type of problem has recently been greatly popularized by the development of the field of nonadapt...
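
As a concrete instance of a linear model constraint combined with a sparsity condition, the prototypical formulations in this literature are basis pursuit and its regularized variant; they are stated here for orientation and are standard, not quoted from the notes themselves.

    % Basis pursuit: the l1-minimal vector consistent with the linear model Ax = y
    \min_{x \in \mathbb{R}^n} \|x\|_1 \quad \text{subject to} \quad Ax = y
    % Noise-aware, regularized counterpart (basis pursuit denoising / the lasso)
    \min_{x \in \mathbb{R}^n} \tfrac{1}{2} \|Ax - y\|_2^2 + \lambda \|x\|_1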


Sequential Sparse NMF

Nonnegative Matrix Factorization (NMF) is a standard tool for data analysis. An important variant is the Sparse NMF problem. A natural measure of sparsity is the L0 norm; however, its optimization is NP-hard. Here, we consider a sparsity measure linear in the ratio of the L1 and L2 norms, and propose an efficient algorithm to handle the norm constraints which arise when optimizing this measure. ...
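
One well-known sparsity measure that is affine in the ratio of the L1 and L2 norms is Hoyer's sparseness; whether the paper uses exactly this normalization is not clear from the snippet, so the sketch below is purely illustrative.

    import numpy as np

    def hoyer_sparseness(x):
        """Sparseness in [0, 1] of a 1-D array: 1 for a single nonzero entry, 0 for all entries equal."""
        n = x.size
        ratio = np.linalg.norm(x, 1) / np.linalg.norm(x, 2)   # ||x||_1 / ||x||_2
        return (np.sqrt(n) - ratio) / (np.sqrt(n) - 1)

Because the measure is an affine function of ||x||_1 / ||x||_2, constraining it amounts to constraining that ratio, which is the kind of norm constraint the proposed algorithm is designed to handle.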



Journal

Journal title: IEEE Transactions on Information Theory

Year: 2014

ISSN: 0018-9448, 1557-9654

DOI: 10.1109/tit.2014.2363846